Fast Krylov Methods for N-Body Learning
Authors
Abstract
This paper addresses the issue of numerical computation in machine learning domains based on similarity metrics, such as kernel methods, spectral techniques and Gaussian processes. It presents a general solution strategy based on Krylov subspace iteration and fast N-body learning methods. The experiments show significant gains in computation and storage on datasets arising in image segmentation, object detection and dimensionality reduction. The paper also presents theoretical bounds on the stability of these methods.
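As a sketch of the general strategy the abstract describes — a Krylov solver that touches the kernel matrix only through matrix-vector products, which is precisely the interface a fast N-body summation method would provide — the following minimal example solves a Gaussian-process-style system (K + σ²I)α = y with matrix-free conjugate gradients in SciPy. The point set, kernel, and noise level are illustrative assumptions, and the dense matvec below stands in for an O(N log N) tree or multipole approximation.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
N = 300
X = rng.standard_normal((N, 2))   # illustrative point set
y = np.sin(X[:, 0])               # illustrative targets
noise = 0.1                       # assumed noise variance / jitter

def kernel_matvec(v):
    # Computes (K + noise*I) @ v for a Gaussian kernel. A fast N-body
    # method would replace this O(N^2) evaluation with an O(N log N)
    # one; the Krylov solver only ever calls this function.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2) @ v + noise * v

A = LinearOperator((N, N), matvec=kernel_matvec, dtype=float)
alpha, info = cg(A, y)            # conjugate gradients: Krylov iteration
```

Because the operator is queried only through `matvec`, the dense N×N kernel matrix is never stored, which is how Krylov iteration combined with fast summation reduces both computation and memory.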
Similar Articles
Fast Krylov Methods for Clustering
At the heart of unsupervised and semi-supervised clustering is the computation of matrix eigenvalues and eigenvectors, or matrix inversion. In general, the complexity of these operations is O(N³). By combining Krylov subspace methods with fast N-body methods, we improve the performance to O(N log N). We also make a thorough evaluation of the errors introduced by the fast algorithm.
Combining Conjugate Direction Methods with Stochastic Approximation of Gradients
The method of conjugate directions provides a very effective way to optimize large, deterministic systems by gradient descent. In its standard form, however, it is not amenable to stochastic approximation of the gradient. Here we explore ideas from conjugate gradient in the stochastic (online) setting, using fast Hessian-gradient products to set up low-dimensional Krylov subspaces within indivi...
Fast Spectral Learning using Lanczos Eigenspace Projections
The core computational step in spectral learning – finding the projection of a function onto the eigenspace of a symmetric operator, such as a graph Laplacian – generally incurs a cubic computational complexity O(N³). This paper describes the use of Lanczos eigenspace projections for accelerating spectral projections, which reduces the complexity to O(nT_op + nN) operations, where n is the number...
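To illustrate the kind of Lanczos-based projection this snippet refers to, the sketch below builds a small path-graph Laplacian and projects a function onto its n smallest eigenvectors using SciPy's ARPACK (Lanczos) interface, avoiding any dense O(N³) eigendecomposition. The graph, sizes, and shift value are illustrative assumptions, not details from the paper.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Illustrative symmetric operator: the Laplacian of a path graph.
N = 200
A = sp.diags([np.ones(N - 1), np.ones(N - 1)], [-1, 1], format="csr")
L = sp.diags(np.asarray(A.sum(axis=1)).ravel()) - A

# Lanczos iteration (via ARPACK, shift-invert near 0) finds the n
# smallest eigenpairs using only sparse matrix-vector products.
n = 10
vals, vecs = eigsh(L, k=n, sigma=-0.01)

f = np.sin(np.linspace(0, np.pi, N))   # function to project
f_proj = vecs @ (vecs.T @ f)           # projection onto the n-dim eigenspace
```

Only n Lanczos vectors are ever kept, so the projection costs a handful of sparse matvecs plus O(nN) inner products rather than a cubic factorization.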
Fast semi-supervised discriminant analysis for binary classification of large data-sets
High-dimensional data requires scalable algorithms. We propose and analyze three scalable and related algorithms for semi-supervised discriminant analysis (SDA). These methods are based on Krylov subspace methods which exploit the data sparsity and the shift-invariance of Krylov subspaces. In addition, the problem definition was improved by adding centralization to the semi-supervised setting. ...
Conjugate Directions for Stochastic Gradient Descent
The method of conjugate gradients provides a very effective way to optimize large, deterministic systems by gradient descent. In its standard form, however, it is not amenable to stochastic approximation of the gradient. Here we explore ideas from conjugate gradient in the stochastic (online) setting, using fast Hessian-gradient products to set up low-dimensional Krylov subspaces within individ...
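The "fast Hessian-gradient products" mentioned in these two abstracts can be realized, for example, by differencing two gradient evaluations on the same minibatch (an exact alternative is Pearlmutter's R-operator). The sketch below is an illustrative least-squares version, not the authors' code; all names, sizes, and the finite-difference choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
A_data = rng.standard_normal((256, 5))   # illustrative design matrix
b = rng.standard_normal(256)             # illustrative targets

def grad(w, idx):
    # Minibatch gradient of the least-squares loss 0.5*||A w - b||^2 / m.
    Ab, bb = A_data[idx], b[idx]
    return Ab.T @ (Ab @ w - bb) / len(idx)

def hess_vec(w, v, idx, eps=1e-6):
    # Hessian-vector product H v from two gradient calls on one minibatch:
    # (grad(w + eps*v) - grad(w)) / eps, exact here since the loss is quadratic.
    return (grad(w + eps * v, idx) - grad(w, idx)) / eps

w = np.zeros(5)
idx = rng.choice(256, size=32, replace=False)
g = grad(w, idx)
Hg = hess_vec(w, g, idx)   # {g, Hg, ...} spans a low-dim Krylov subspace
```

Repeating the product on successive vectors builds exactly the low-dimensional Krylov subspace within a minibatch that the snippets describe.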